Second-order normal vectors to a convex epigraph

Authors

Abstract


Similar resources

Second-order complex random vectors and normal distributions

We formulate as a deconvolution problem the causal/noncausal non-Gaussian multichannel autoregressive (AR) parameter estimation problem. The super-exponential algorithm presented in a recent paper by Shalvi and Weinstein is generalized to the vector case. We present an adaptive implementation that is very attractive since it is higher order statistics (HOS) based but does no...


Epigraph projections for fast general convex programming

This paper develops an approach for efficiently solving general convex optimization problems specified as disciplined convex programs (DCP), a common general-purpose modeling framework. Specifically, we develop an algorithm based upon fast epigraph projections, projections onto the epigraph of a convex function, an approach closely linked to proximal operator methods. We show that by using these...

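As a concrete illustration of the kind of subproblem this abstract refers to, the sketch below projects a point onto the epigraph of the particular function f(x) = ||x||_1, using the standard reduction of an epigraph projection to one-dimensional root finding over the proximal parameter. This is only a minimal sketch, not the authors' algorithm; the function names, the test point, and the bisection tolerance are our own choices.

```python
import numpy as np

def soft_threshold(v, lam):
    """Prox of lam * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def project_epigraph_l1(v, s, tol=1e-10, max_iter=200):
    """Project (v, s) onto epi f = {(x, t): ||x||_1 <= t}.

    Generic reduction: the projection is (prox_{lam*f}(v), s + lam) for the
    lam >= 0 solving ||prox_{lam*f}(v)||_1 = s + lam, found by bisection.
    """
    if np.sum(np.abs(v)) <= s:          # (v, s) already lies in the epigraph
        return v.copy(), float(s)

    def h(lam):
        # Residual of the optimality condition; positive at lam = 0,
        # negative for large lam, so a root can be bracketed.
        return np.sum(np.abs(soft_threshold(v, lam))) - s - lam

    lam_lo, lam_hi = 0.0, 1.0
    while h(lam_hi) > 0:                # grow the bracket until the sign flips
        lam_hi *= 2.0
    for _ in range(max_iter):           # plain bisection on [lam_lo, lam_hi]
        lam = 0.5 * (lam_lo + lam_hi)
        if abs(h(lam)) < tol:
            break
        if h(lam) > 0:
            lam_lo = lam
        else:
            lam_hi = lam
    return soft_threshold(v, lam), s + lam

# Tiny usage check: the projected point lands on the boundary ||x||_1 == t.
x, t = project_epigraph_l1(np.array([3.0, -1.0, 0.5]), 1.0)
print(x, t, np.sum(np.abs(x)))
```

The same reduction works for any f whose prox is cheap to evaluate; only the soft-thresholding step is specific to the l1 example chosen here.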

Epigraph proximal algorithms for general convex programming

This work aims at partially bridging the gap between (generic but slow) “general-purpose” convex solvers and (fast but constricted) “specialized” solvers. We develop the Epsilon system, a solver capable of handling general convex problems (inputs are specified directly as disciplined convex programs); however, instead of transforming these problems to cone form, the compiler transforms them to a...

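Both of the abstracts above take disciplined convex programs (DCP) as input. As a point of reference, here is a minimal DCP example written with the open-source CVXPY modeling layer; it is illustrative only (CVXPY is not the Epsilon system or the epigraph-projection solver from these papers), and the problem data are invented.

```python
# A tiny disciplined convex program: box-constrained lasso-style regression.
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)

x = cp.Variable(5)
# Every expression is built from DCP-verifiable atoms, which is what lets a
# compiler rewrite the problem (to cone form, or to prox/epigraph form).
objective = cp.Minimize(cp.sum_squares(A @ x - b) + cp.norm1(x))
constraints = [x >= -1, x <= 1]
problem = cp.Problem(objective, constraints)
problem.solve()
print(problem.status, problem.value)
```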

Second-order Characterizations of Convex and Pseudoconvex Functions

The present paper gives characterizations of radially u.s.c. convex and pseudoconvex functions f : X → R defined on a convex subset X of a real linear space E in terms of first and second-order upper Dini-directional derivatives. Observing that the property that f is radially u.s.c. does not require a topological structure on E, we are able to state our results for arbitrary real linear spa...

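As a reminder of the derivatives this abstract refers to, one common convention for the first- and second-order upper Dini directional derivatives is recalled below; the exact normalization used in the paper may differ.

```latex
% Upper Dini directional derivatives of f at x in direction u
% (a common convention; the second is taken when f'_+(x;u) is finite):
f'_+(x;u)  \;=\; \limsup_{t \downarrow 0} \frac{f(x+tu)-f(x)}{t},
\qquad
f''_+(x;u) \;=\; \limsup_{t \downarrow 0}
               \frac{2\bigl(f(x+tu)-f(x)-t\,f'_+(x;u)\bigr)}{t^{2}}.
```

Roughly speaking, characterizations of this kind impose sign conditions on these quantities along line segments in X.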

Second-Order Kernel Online Convex Optimization with Adaptive Sketching

Kernel online convex optimization (KOCO) is a framework combining the expressiveness of nonparametric kernel models with the regret guarantees of online learning. First-order KOCO methods such as functional gradient descent require only O(t) time and space per iteration, and, when the only information on the losses is their convexity, achieve a minimax-optimal O(√T) regret. Nonetheless, many ...

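For context, the first-order baseline mentioned in this abstract (kernel online functional gradient descent, whose O(t) per-step cost comes from the growing support set) can be sketched as follows. This is not the paper's second-order sketched method; the kernel, loss, step size, and class name are assumptions made for illustration.

```python
import numpy as np

def rbf(x, z, gamma=1.0):
    """Gaussian (RBF) kernel between two points."""
    return np.exp(-gamma * np.sum((x - z) ** 2))

class KernelOnlineGD:
    """First-order kernel online learner: functional gradient descent on the
    squared loss. Every observed point is kept as a support point, so step t
    costs O(t) time and space, matching the cost quoted in the abstract."""

    def __init__(self, eta=0.5, gamma=1.0):
        self.eta, self.gamma = eta, gamma
        self.points, self.alphas = [], []

    def predict(self, x):
        return sum(a * rbf(z, x, self.gamma)
                   for z, a in zip(self.points, self.alphas))

    def update(self, x, y):
        # Functional gradient of 0.5*(f(x)-y)^2 at f is (f(x)-y) * k(x, .),
        # so the update appends x with coefficient -eta * residual.
        residual = self.predict(x) - y
        self.points.append(x)
        self.alphas.append(-self.eta * residual)

# Toy stream: learn y = sin(x) on [0, 3] from 200 online samples.
rng = np.random.default_rng(0)
learner = KernelOnlineGD(eta=0.5, gamma=2.0)
for _ in range(200):
    x = rng.uniform(0.0, 3.0, size=1)
    learner.update(x, np.sin(x[0]))
print(learner.predict(np.array([1.5])), np.sin(1.5))
```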


Journal

Journal title: Bulletin of the Australian Mathematical Society

Year: 1994

ISSN: 0004-9727, 1755-1633

DOI: 10.1017/s0004972700009631